Entropy estimation
In various science and engineering applications, such as independent component analysis,〔Dinh-Tuan Pham (2004) Fast algorithms for mutual information based independent component analysis. In ''Signal Processing'', Volume 52, Issue 10, 2690–2700.〕 image analysis,〔Chang, C.-I.; Du, Y.; Wang, J.; Guo, S.-M.; Thouin, P.D. (2006) Survey and comparative analysis of entropy and relative entropy thresholding techniques. In ''Vision, Image and Signal Processing'', Volume 153, Issue 6, 837–850.〕 genetic analysis,〔Martins, D. C. ''et al.'' (2008) Intrinsically Multivariate Predictive Genes. In ''Selected Topics in Signal Processing'', Volume 2, Issue 3, 424–439.〕 speech recognition,〔Gue Jun Jung; Yung-Hwan Oh (2008) Information Distance-Based Subvector Clustering for ASR Parameter Quantization. In ''Signal Processing Letters'', Volume 15, 209–212.〕 manifold learning,〔Costa, J.A.; Hero, A.O. (2004) Geodesic entropic graphs for dimension and entropy estimation in manifold learning. In ''Signal Processing'', Volume 52, Issue 8, 2210–2221.〕 and time delay estimation,〔Benesty, J.; Yiteng Huang; Jingdong Chen (2007) Time Delay Estimation via Minimum Entropy. In ''Signal Processing Letters'', Volume 14, Issue 3, 157–160.〕 it is useful to estimate the differential entropy of a system or process, given some observations.
The simplest and most common approach uses histogram-based estimation, but other approaches have been developed and used, each with its own benefits and drawbacks.〔J. Beirlant, E. J. Dudewicz, L. Gyorfi, and E. C. van der Meulen (1997) Nonparametric entropy estimation: An overview. In ''International Journal of Mathematical and Statistical Sciences'', Volume 6, pp. 17–39.〕 The main factor in choosing a method is often a trade-off between the bias and the variance of the estimate,〔T. Schürmann (2004) Bias analysis in entropy estimation. In ''J. Phys. A: Math. Gen.'', 37, pp. L295–L301.〕 although the nature of the (suspected) distribution of the data may also be a factor.
==Histogram estimator==
The histogram approach uses the idea that the differential entropy,
:H(X) = -\int_{\mathbb{X}} f(x)\log f(x)\,dx
can be approximated by producing a histogram of the observations, and then finding the discrete entropy
:H(X) = -\sum_{i=1}^{n} f(x_i)\log \left(\frac{f(x_i)}{w(x_i)}\right)
of that histogram (which is itself a maximum-likelihood (ML) estimate of the discretized frequency distribution), where ''w'' is the width of the ''i''th bin. Histograms are quick to calculate and simple to use, so this approach has some appeal. However, the estimate produced is biased, and although corrections can be made to the estimate, they may not always be satisfactory.〔G. Miller (1955) Note on the bias of information estimates. In ''Information Theory in Psychology: Problems and Methods'', pp. 95–100.〕
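The following is a minimal sketch of this estimator, assuming NumPy is available; the function name, bin count, and test distribution are illustrative choices and not part of the original text. It takes the normalized bin counts as the ML frequency estimates and applies the formula above, dividing each bin's probability by its width.
<syntaxhighlight lang="python">
import numpy as np

def histogram_entropy(samples, bins=30):
    """Estimate differential entropy (in nats) of 1-D samples from a histogram.

    Applies H ~= -sum_i p_i * log(p_i / w_i), where p_i is the fraction of
    samples falling in bin i and w_i is the width of bin i.
    """
    counts, edges = np.histogram(samples, bins=bins)
    widths = np.diff(edges)           # w_i: the bin widths
    p = counts / counts.sum()         # ML estimate of the bin probabilities
    nz = p > 0                        # empty bins contribute nothing
    return -np.sum(p[nz] * np.log(p[nz] / widths[nz]))

# Example: standard normal samples; the true differential entropy is
# 0.5 * log(2 * pi * e) ~= 1.42 nats.
rng = np.random.default_rng(0)
print(histogram_entropy(rng.standard_normal(10_000)))
</syntaxhighlight>
As noted above, such an estimate is biased; Miller's correction adds approximately (''m'' − 1)/(2''N'') nats for ''m'' occupied bins and ''N'' observations, and the result also depends on the choice of bin width.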
A method better suited to multidimensional probability density functions (pdfs) is to first estimate the pdf with some method, and then compute the entropy from the pdf estimate. A useful pdf estimation method is, for example, Gaussian mixture modeling (GMM), in which the expectation-maximization (EM) algorithm is used to find an ML estimate of a weighted sum of Gaussian pdfs approximating the data pdf.
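Below is a minimal sketch of this GMM-based route, assuming scikit-learn's GaussianMixture (which fits the mixture by EM) is available; since a Gaussian mixture's entropy has no closed form, it is approximated here by Monte Carlo as the negative mean log-density over draws from the fitted model. The function name, component count, and test data are illustrative assumptions.
<syntaxhighlight lang="python">
import numpy as np
from sklearn.mixture import GaussianMixture

def gmm_entropy(samples, n_components=3, n_mc=100_000, seed=0):
    """Estimate differential entropy (in nats) by fitting a Gaussian mixture
    with EM, then approximating H = -E[log f(X)] by Monte Carlo over draws
    from the fitted mixture."""
    samples = np.asarray(samples)
    if samples.ndim == 1:
        samples = samples[:, None]             # GaussianMixture expects 2-D input
    gmm = GaussianMixture(n_components=n_components, random_state=seed)
    gmm.fit(samples)                           # EM finds the ML mixture parameters
    draws, _ = gmm.sample(n_mc)                # sample from the fitted pdf estimate
    return -gmm.score_samples(draws).mean()    # score_samples returns log f(x)

# Example: correlated 2-D Gaussian data; the true entropy is
# 0.5 * log((2 * pi * e)**2 * det(cov)) ~= 2.61 nats.
rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.6], [0.6, 1.0]])
data = rng.multivariate_normal(np.zeros(2), cov, size=5_000)
print(gmm_entropy(data))
</syntaxhighlight>
The Monte Carlo step is used because the entropy of a Gaussian mixture has no closed-form expression; the number of mixture components is a modeling choice, often selected with a criterion such as BIC.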
